
# Child Language Acquisition

BabyBERTa-1
A lightweight RoBERTa variant trained on a 5-million-word corpus of American English child-directed input, designed specifically for language acquisition research.
Large Language Model · Transformers · English
phueb · 295 downloads · 3 likes
BabyBERTa-3 (MIT license)
BabyBERTa is a lightweight version of RoBERTa, designed specifically for language acquisition research and trained on a 5-million-word corpus of American English child-directed input.
Large Language Model · Transformers · English
phueb · 22 downloads · 0 likes
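
Both entries are RoBERTa-style masked language models distributed via the Transformers library, so they can be queried for masked-word predictions. Below is a minimal sketch of doing so; the Hugging Face Hub repo ID (`phueb/BabyBERTa-1`) and the `add_prefix_space=True` tokenizer setting are assumptions inferred from the author and model family shown in this listing, not details stated on this page.

```python
from transformers import RobertaForMaskedLM, RobertaTokenizerFast, pipeline

# Hypothetical Hub repo ID, inferred from the author/model names in the
# listing above; verify against the actual model card before relying on it.
MODEL_ID = "phueb/BabyBERTa-1"

# Assumption: like other RoBERTa-style BPE tokenizers, BabyBERTa's tokenizer
# is commonly loaded with add_prefix_space=True; check the model card.
tokenizer = RobertaTokenizerFast.from_pretrained(MODEL_ID, add_prefix_space=True)
model = RobertaForMaskedLM.from_pretrained(MODEL_ID)

# RoBERTa-style masked language model, so the mask token is "<mask>".
fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)

for prediction in fill("The baby drank the <mask>."):
    print(f"{prediction['token_str']:>12}  {prediction['score']:.3f}")
```

Because the training data is child-directed speech rather than web text, one would expect the predicted fillers to skew toward everyday, child-register vocabulary.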